On the Nyström Method for Approximating a Gram Matrix for Improved Kernel-Based Learning

Authors

  • Petros Drineas
  • Michael W. Mahoney
Abstract

A problem for many kernel-based methods is that the amount of computation required to find the solution scales as O(n^3), where n is the number of training examples. We develop and analyze an algorithm to compute an easily-interpretable low-rank approximation to an n × n Gram matrix G such that computations of interest may be performed more rapidly. The approximation is of the form G̃_k = C W_k^+ C^T, where C is a matrix consisting of a small number c of columns of G and W_k is the best rank-k approximation to W, the matrix formed by the intersection of those c columns of G with the corresponding c rows of G. An important aspect of the algorithm is the probability distribution used to randomly sample the columns; we use a judiciously-chosen, data-dependent nonuniform probability distribution. Let ‖·‖_2 and ‖·‖_F denote the spectral norm and the Frobenius norm, respectively, of a matrix, and let G_k be the best rank-k approximation to G. We prove that by choosing O(k/ε^4) columns ...
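The construction described in the abstract can be sketched in a few lines of NumPy. The sketch below is a hypothetical illustration, not the authors' reference code: it samples c columns with the data-dependent probabilities p_i ∝ G_ii² (the distribution analyzed in the paper), rescales them, forms the c × c intersection block W, and returns C W_k^+ C^T, where W_k^+ is the pseudoinverse of the best rank-k approximation of W.

```python
import numpy as np

def nystrom_gram_approx(G, c, k, seed=0):
    """Rank-k Nystrom approximation G_tilde = C W_k^+ C^T of a PSD Gram matrix G.

    Illustrative sketch: samples c columns with probabilities p_i
    proportional to G_ii^2, as in the column-sampling scheme above.
    """
    rng = np.random.default_rng(seed)
    n = G.shape[0]
    # Data-dependent nonuniform sampling probabilities p_i ∝ G_ii^2.
    p = np.diag(G) ** 2
    p = p / p.sum()
    idx = rng.choice(n, size=c, replace=True, p=p)
    # Rescale each sampled column by 1/sqrt(c * p_i) (importance-sampling scale).
    scale = 1.0 / np.sqrt(c * p[idx])
    C = G[:, idx] * scale               # n x c: sampled, rescaled columns
    W = C[idx, :] * scale[:, None]      # c x c: intersection block, scaled on both sides
    # Pseudoinverse of the best rank-k approximation of W via eigendecomposition.
    vals, vecs = np.linalg.eigh(W)
    top = np.argsort(vals)[::-1][:k]
    keep = vals[top] > 1e-12            # drop numerically zero eigenvalues
    V = vecs[:, top][:, keep]
    Wk_pinv = (V / vals[top][keep]) @ V.T
    return C @ Wk_pinv @ C.T
```

If G has exact rank at most k and the sampled columns span its range, the reconstruction is exact; for general G the approximation error is governed by the bound quoted in the abstract.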


Similar articles

Nyström Approximations for Scalable Face Recognition: A Comparative Study

Kernel principal component analysis (KPCA) is a widely-used statistical method for representation learning, where PCA is performed in reproducing kernel Hilbert space (RKHS) to extract nonlinear features from a set of training examples. Despite its success in various applications including face recognition, KPCA does not scale up well with the sample size, since, as in other kernel methods, it i...


Large Scale Online Kernel Classification

In this work, we present a new framework for large scale online kernel classification, making kernel methods efficient and scalable for large-scale online learning tasks. Unlike the regular budget kernel online learning scheme that usually uses different strategies to bound the number of support vectors, our framework explores a functional approximation approach to approximating a kernel functi...


Fast and Accurate Refined Nyström-Based Kernel SVM

In this paper, we focus on improving the performance of the Nyström-based kernel SVM. Although the Nyström approximation has been studied extensively and its application to kernel classification has been exhibited in several studies, there still exists a potentially large gap between the performance of a classifier learned with the Nyström approximation and that learned with the original kernel. ...


Improved Bounds for the Nyström Method With Application to Kernel Classification

We develop two approaches for analyzing the approximation error bound for the Nyström method, one based on the concentration inequality of integral operators, and one based on compressive sensing theory. We show that the approximation error, measured in the spectral norm, can be improved from O(N/√m) to O(N/m) in the case of a large eigengap, where N is the total number of data points, m is ...


Nystrom Approximation for Sparse Kernel Methods: Theoretical Analysis and Empirical Evaluation

Nyström approximation is an effective approach to accelerating the computation of kernel matrices in many kernel methods. In this paper, we consider the Nyström approximation for sparse kernel methods. Instead of relying on the low-rank assumption on the original kernels, which does not hold in some applications, we take advantage of the restricted eigenvalue condition, which has been p...



Journal:
  • Journal of Machine Learning Research

Volume 6, Issue —

Pages —

Publication date: 2005